Refinement Inequalities among Symmetric Divergence Measures
Abstract
There are three classical divergence measures in the literature on information theory and statistics, namely, the Jeffreys-Kullback-Leibler J-divergence, the Sibson-Burbea-Rao Jensen-Shannon divergence, and the Taneja arithmetic-geometric mean divergence. These bear an interesting relationship to one another and are based on logarithmic expressions. Divergence measures such as the Hellinger discrimination, the symmetric χ²-divergence, and the triangular discrimination are not based on logarithmic expressions. All six divergence measures are symmetric with respect to the probability distributions. In this paper some interesting inequalities among these symmetric divergence measures are studied, and refinements of these inequalities are given. Some inequalities due to Dragomir et al. [6] are also improved.
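The six measures named in the abstract have standard definitions in this literature, and the inequalities among them can be checked numerically. The following is a minimal illustrative sketch (not taken from the paper itself); the ordering asserted at the end, (1/4)Δ ≤ I ≤ h ≤ (1/8)J ≤ T ≤ (1/16)Ψ, is the chain studied in this family of papers, and the sample distributions are arbitrary:

```python
# Six symmetric divergence measures between discrete probability
# distributions P and Q (given as sequences of positive probabilities),
# with a numerical check of the inequality chain
#   (1/4)Δ(P,Q) ≤ I(P,Q) ≤ h(P,Q) ≤ (1/8)J(P,Q) ≤ T(P,Q) ≤ (1/16)Ψ(P,Q).
import math

def j_divergence(p, q):
    """Jeffreys-Kullback-Leibler J-divergence, J(P,Q)."""
    return sum((pi - qi) * math.log(pi / qi) for pi, qi in zip(p, q))

def js_divergence(p, q):
    """Sibson-Burbea-Rao Jensen-Shannon divergence, I(P,Q)."""
    return 0.5 * sum(pi * math.log(2 * pi / (pi + qi)) +
                     qi * math.log(2 * qi / (pi + qi)) for pi, qi in zip(p, q))

def agm_divergence(p, q):
    """Taneja arithmetic-geometric mean divergence, T(P,Q)."""
    return sum(((pi + qi) / 2) * math.log((pi + qi) / (2 * math.sqrt(pi * qi)))
               for pi, qi in zip(p, q))

def hellinger(p, q):
    """Hellinger discrimination, h(P,Q)."""
    return 0.5 * sum((math.sqrt(pi) - math.sqrt(qi)) ** 2 for pi, qi in zip(p, q))

def sym_chi2(p, q):
    """Symmetric chi-square divergence, Ψ(P,Q)."""
    return sum((pi - qi) ** 2 * (pi + qi) / (pi * qi) for pi, qi in zip(p, q))

def triangular(p, q):
    """Triangular discrimination, Δ(P,Q)."""
    return sum((pi - qi) ** 2 / (pi + qi) for pi, qi in zip(p, q))

# Arbitrary sample distributions on a two-point space.
P, Q = (0.9, 0.1), (0.1, 0.9)
chain = [triangular(P, Q) / 4, js_divergence(P, Q), hellinger(P, Q),
         j_divergence(P, Q) / 8, agm_divergence(P, Q), sym_chi2(P, Q) / 16]
assert all(a <= b for a, b in zip(chain, chain[1:]))
```

All six functions reduce to 0 when P = Q, and each term of the chain grows as the distributions separate, which is what makes the refinement inequalities between adjacent members informative.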
Similar works
A Sequence of Inequalities among Difference of Symmetric Divergence Measures
In this paper we have considered two one-parametric generalizations. These generalizations include as particular cases the well-known measures: J-divergence, Jensen-Shannon divergence, and arithmetic-geometric mean divergence. These three measures are based on logarithmic expressions. As further particular cases we obtain measures such as: Hellinger discrimination, symmetric χ²-divergence, and trian...
Generalized Symmetric Divergence Measures and Inequalities
The first measure generalizes the well-known J-divergence due to Jeffreys [16] and Kullback and Leibler [17]. The second measure gives a unified generalization of the Jensen-Shannon divergence due to Sibson [22] and Burbea and Rao [2, 3], and of the arithmetic-geometric mean divergence due to Taneja [27]. These two measures contain in particular some well-known divergences such as: Hellinger's discr...
Relative Divergence Measures and Information Inequalities
Many information and divergence measures exist in the literature on information theory and statistics. The most famous among them are Kullback-Leibler's [17] relative information and Jeffreys' [16] J-divergence; the information radius, or Jensen difference divergence measure, is due to Sibson [23]. Burbea and Rao [3, 4] have also found applications for it in the literature. Taneja [25] studied anoth...
Nested Inequalities Among Divergence Measures
In this paper we have considered an inequality involving 11 divergence measures. Three of them are logarithmic, namely the Jeffreys-Kullback-Leibler [4] [5] J-divergence, the Burbea-Rao [1] Jensen-Shannon divergence, and the Taneja [7] arithmetic-geometric mean divergence. Another three are non-logarithmic, namely the Hellinger discrimination, the symmetric χ²-divergence, and the triangular discrimination. Three more ...
Some Inequalities Among New Divergence Measures
Abstract: There are three classical divergence measures in the literature on information theory and statistics, namely the Jeffreys-Kullback-Leibler J-divergence, the Burbea-Rao [1] Jensen-Shannon divergence, and the Taneja [8] arithmetic-geometric mean divergence. These three measures bear an interesting relationship among each other and are based on logarithmic expressions. The divergence m...